Matrix Learning in Learning Vector Quantization
Abstract
We propose a new matrix learning scheme that extends Generalized Relevance Learning Vector Quantization (GRLVQ), an efficient prototype-based classification algorithm. By introducing a full matrix of relevance factors into the distance measure, correlations between features and their importance for the classification task can be taken into account, and automated, general metric adaptation takes place during training. Compared to the weighted Euclidean metric used in GRLVQ, a full matrix is more powerful and can represent the internal structure of the data more appropriately. Interestingly, large-margin generalization bounds can be transferred to the full-matrix case, yielding bounds that are independent of the input dimensionality and the number of parameters. This also holds for local metrics attached to the individual prototypes. The algorithm is tested and compared to GLVQ without metric adaptation [16] and to GRLVQ with diagonal relevance factors, using an artificial dataset and the image segmentation data from the UCI repository [15].
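For concreteness, here is a minimal sketch of the adaptive distance induced by a full relevance matrix; the function name and toy values are our own illustration, not taken from the paper. Parameterizing the relevance matrix as Λ = Ωᵀ Ω guarantees that the distance stays non-negative:

```python
import numpy as np

def matrix_distance(x, w, omega):
    """Adaptive squared distance d(x, w) = (x - w)^T Lambda (x - w),
    with Lambda = Omega^T Omega, so the value is always >= 0."""
    diff = omega @ (x - w)
    return float(diff @ diff)

# Toy usage (illustrative values only): with Omega = I the measure
# reduces to the plain squared Euclidean distance.
x = np.array([1.0, 2.0])
w = np.array([0.5, 1.5])
print(matrix_distance(x, w, np.eye(2)))  # -> 0.5
```

During training, both the prototypes and Ω are adapted; restricting Ω (and hence Λ) to a diagonal matrix recovers the weighted Euclidean metric of GRLVQ mentioned above.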
Similar References
Learning Matrix Quantization and Variants of Relevance Learning
We propose an extension of the learning vector quantization framework for matrix data. Data in matrix form occur in several areas, such as gray-scale images, time-dependent spectra, or fMRI data. If the matrix data are vectorized, important spatial information may be lost. Thus, processing matrix data in matrix form seems to be more appropriate. However, it requires matrix dissimilarities for data c...
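One way to realize such a matrix dissimilarity without vectorizing is sketched below under our own assumptions; the bilinear form and the names U and V are illustrative, not the paper's definition. Rows and columns of the difference are weighted separately, so the 2-D layout is preserved:

```python
import numpy as np

def matrix_dissimilarity(X, W, U, V):
    """Bilinear Frobenius-type dissimilarity ||U (X - W) V||_F^2.
    U acts on rows, V on columns, so the 2-D structure of X is used
    directly instead of being flattened into one long vector."""
    D = U @ (X - W) @ V
    return float(np.sum(D * D))

# Toy usage: a 3x4 "gray-scale patch" compared to a zero prototype,
# with both weighting matrices initialized to identities.
rng = np.random.default_rng(0)
X, W = rng.random((3, 4)), np.zeros((3, 4))
print(matrix_dissimilarity(X, W, np.eye(3), np.eye(4)))
```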
Stationarity of Matrix Relevance Learning Vector Quantization
We investigate the convergence properties of heuristic matrix relevance updates in Learning Vector Quantization. Under mild assumptions on the training process, stationarity conditions can be worked out which characterize the outcome of training in terms of the relevance matrix. It is shown that the original training schemes single out one specific direction in feature space which depends on th...
Regularization in Relevance Learning Vector Quantization Using l1-Norms
In this contribution we propose a method for l1-regularization in prototype-based relevance learning vector quantization (LVQ) to obtain sparse relevance profiles. Sparse relevance profiles in hyperspectral data analysis fade down those spectral bands which are not necessary for classification. In particular, we consider the sparsity in the relevance profile enforced by LASSO optimization. The latter...
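The sparsity mechanism can be illustrated with the proximal (soft-thresholding) step that LASSO-type l1 penalties induce. This sketch uses our own names and toy values, and omits the renormalization of the relevance profile that GRLVQ-type schemes typically enforce:

```python
import numpy as np

def soft_threshold(relevances, step, lam):
    """Proximal step for an l1 penalty: relevances whose magnitude falls
    below step * lam are set exactly to zero, the rest shrink toward zero.
    This is what fades down uninformative spectral bands."""
    return np.sign(relevances) * np.maximum(np.abs(relevances) - step * lam, 0.0)

# Toy relevance profile over four spectral bands: small relevances are
# pruned, large ones merely shrink.
profile = np.array([0.02, 0.40, 0.01, 0.90])
print(soft_threshold(profile, step=0.1, lam=0.5))  # -> [0.   0.35 0.   0.85]
```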
Distance Learning in Discriminative Vector Quantization
Discriminative vector quantization schemes such as learning vector quantization (LVQ) and extensions thereof offer efficient and intuitive classifiers based on the representation of classes by prototypes. The original methods, however, rely on the Euclidean distance corresponding to the assumption that the data can be represented by isotropic clusters. For this reason, extensions of the methods...
Matrix adaptation in discriminative vector quantization
Discriminative vector quantization schemes such as learning vector quantization (LVQ) and extensions thereof offer efficient and intuitive classifiers which are based on the representation of classes by prototypes. The original methods, however, rely on the Euclidean distance corresponding to the assumption that the data can be represented by isotropic clusters. For this reason, extensions of t...
Publication date: 2006